Bayesian computation for statistical models with intractable normalizing constants
Authors
Abstract
This paper deals with a computational aspect of the Bayesian analysis of statistical models with intractable normalizing constants. In the presence of intractable normalizing constants in the likelihood function, traditional MCMC methods cannot be applied. We propose here a general approach to sample from such posterior distributions that bypasses the computation of the normalizing constant. Our method can be thought of as a Bayesian version of the MCMC-MLE approach of [8]. To the best of our knowledge, this is the first general and asymptotically consistent Monte Carlo method for such problems, even though [12] has made some progress in this direction. We illustrate our approach with examples from image segmentation and social network modeling. We also study the asymptotic behavior of the algorithm and obtain a strong law of large numbers for empirical averages. AMS 2000 subject classifications: Primary 60J27, 60J35, 65C40.
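For concreteness, the sketch below illustrates the setting the abstract describes and the importance-sampling identity underlying MCMC-MLE, which the paper builds on: a likelihood f(x|theta) = q(x|theta)/Z(theta) whose normalizing constant Z(theta) is intractable, and an estimate of the ratio Z(theta)/Z(theta0) obtained from draws simulated at a fixed reference value theta0. This is not the authors' algorithm; the toy Ising-chain model, the Gibbs sampler, and all names (suff, log_q, gibbs_samples, log_z_ratio) are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 20                               # sites in a toy one-dimensional Ising-type chain
    x_obs = rng.integers(0, 2, size=n)   # stand-in for the observed data

    def suff(x):
        # sufficient statistic: number of agreeing neighbouring pairs
        return np.sum(x[:-1] == x[1:])

    def log_q(x, theta):
        # unnormalised log-likelihood log q(x | theta); the normalising constant
        # Z(theta) = sum over all 2^n configurations is intractable in general
        return theta * suff(x)

    def gibbs_samples(theta, n_samples, sweeps=10):
        # crude Gibbs sampler producing approximate draws from f(. | theta)
        draws, x = [], rng.integers(0, 2, size=n)
        for _ in range(n_samples):
            for _ in range(sweeps):
                for i in range(n):
                    x0, x1 = x.copy(), x.copy()
                    x0[i], x1[i] = 0, 1
                    p1 = 1.0 / (1.0 + np.exp(log_q(x0, theta) - log_q(x1, theta)))
                    x[i] = int(rng.random() < p1)
            draws.append(x.copy())
        return draws

    # MCMC-MLE style importance-sampling identity, with theta0 a fixed reference:
    #   Z(theta) / Z(theta0) = E_{x ~ f(.|theta0)}[ q(x|theta) / q(x|theta0) ]
    theta0 = 0.5
    aux = gibbs_samples(theta0, n_samples=300)

    def log_z_ratio(theta):
        logw = np.array([log_q(x, theta) - log_q(x, theta0) for x in aux])
        m = logw.max()
        return m + np.log(np.mean(np.exp(logw - m)))

    # With this estimate, log pi(theta | x_obs) can be evaluated up to the single
    # unknown constant log Z(theta0), which cancels in Metropolis-Hastings ratios:
    #   log_prior(theta) + log_q(x_obs, theta) - log_z_ratio(theta)
    # (log_prior is whatever prior one chooses; it is not specified here.)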
Similar papers
A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants
Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo...
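The snippet above says the MCMH algorithm replaces the unknown normalizing-constant ratio in the Metropolis-Hastings acceptance probability by a Monte Carlo estimate. Below is a minimal, hedged sketch of that idea for a generic model q(x|theta)/Z(theta) with a symmetric proposal; it is not the exact scheme of the cited letter, and sample_aux, proposal, log_prior, and log_q are assumed user-supplied callables.

    import numpy as np

    def mcmh_step(theta, x_obs, log_prior, log_q, sample_aux, proposal, m, rng):
        # One Metropolis-Hastings update for pi(theta | x_obs) proportional to
        # p(theta) q(x_obs | theta) / Z(theta), with the intractable ratio
        # Z(theta) / Z(theta_prop) replaced by a Monte Carlo estimate built from
        # m auxiliary draws x_j ~ f(. | theta_prop). Sketch only; the proposal is
        # assumed symmetric.
        theta_prop = proposal(theta, rng)
        aux = [sample_aux(theta_prop, rng) for _ in range(m)]
        # Z(theta)/Z(theta_prop) = E_{x ~ f(.|theta_prop)}[ q(x|theta) / q(x|theta_prop) ]
        logw = np.array([log_q(x, theta) - log_q(x, theta_prop) for x in aux])
        log_ratio_hat = logw.max() + np.log(np.mean(np.exp(logw - logw.max())))
        # plug the estimate into the usual MH log acceptance ratio
        log_alpha = (log_prior(theta_prop) - log_prior(theta)
                     + log_q(x_obs, theta_prop) - log_q(x_obs, theta)
                     + log_ratio_hat)
        return theta_prop if np.log(rng.random()) < log_alpha else theta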
Monte Carlo Methods on Bayesian Analysis of Constrained Parameter Problems with Normalizing Constants
Constraints on the parameters in a Bayesian hierarchical model typically make Bayesian computation and analysis complicated. As Gelfand, Smith and Lee (1992) remarked, it is almost impossible to sample from a posterior distribution when its density contains analytically intractable integrals (normalizing constants) that depend on the (hyper) parameters. Therefore, the Gibbs sampler or the Metro...
Advances in Markov chain Monte Carlo methods
Probability distributions over many variables occur frequently in Bayesian inference, statistical physics and simulation studies. Samples from distributions give insight into their typical behavior and can allow approximation of any quantity of interest, such as expectations or normalizing constants. Markov chain Monte Carlo (MCMC), introduced by Metropolis et al. (1953), allows sampling from d...
An Adaptive Exchange Algorithm for Sampling from Distributions with Intractable Normalizing Constants
Faming Liang, Ick Hoon Jin, Qifan Song & Jun S. Liu (2015). An Adaptive Exchange Algorithm for Sampling from Distributions with Intractable Normalizing Constants. Journal of the American Statistical Association, DOI: 10.1...
Bayesian Sparsity for Intractable Distributions
Bayesian approaches for single-variable and group-structured sparsity outperform L1 regularization, but are challenging to apply to large, potentially intractable models. Here we show how noncentered parameterizations, a common trick for improving the efficiency of exact inference in hierarchical models, can similarly improve the accuracy of variational approximations. We develop this with two ...
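The noncentered trick mentioned in this snippet can be written in one line for a Gaussian hierarchical layer. The toy model below is an assumption for illustration only, not the construction used in the cited paper.

    import numpy as np

    # Centred parameterisation of a toy hierarchical layer: theta_i ~ N(mu, tau^2).
    def sample_centred(mu, tau, k, rng):
        return rng.normal(mu, tau, size=k)

    # Non-centred parameterisation: theta_i = mu + tau * eps_i with eps_i ~ N(0, 1),
    # so the local variables eps are a priori independent of (mu, tau). Rewriting the
    # model this way often improves the geometry seen by exact samplers and, as the
    # snippet claims, by variational approximations.
    def sample_noncentred(mu, tau, k, rng):
        eps = rng.normal(0.0, 1.0, size=k)
        return mu + tau * eps

    rng = np.random.default_rng(0)
    # Both parameterisations induce the same distribution on theta.
    print(sample_centred(1.0, 0.5, 3, rng))
    print(sample_noncentred(1.0, 0.5, 3, rng))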
Journal title:
Volume, Issue:
Pages: -
Publication date: 2008